Jeffrey's rule of conditioning generalized to belief functions
Abstract
Jeffrey's rule of conditioning has been proposed in order to revise a probability measure by another probability measure. We generalize it within the framework of the models based on belief functions. We show that several forms of Jeffrey's conditioning can be defined, corresponding to the geometrical rule of conditioning and to Dempster's rule of conditioning, respectively.

1. Jeffrey's rule in probability theory.

In probability theory, conditioning on an event is classically obtained by the application of Bayes' rule. Let $(\Omega, \mathcal{A}, P)$ be a probability space where $P(A)$ is the probability of the event $A \in \mathcal{A}$, and $\mathcal{A}$ is a Boolean algebra defined on a finite set $\Omega$. $P(A)$ quantifies the degree of belief, or the objective probability, depending on the interpretation given to the probability measure, that a particular arbitrary element $\omega$ of $\Omega$, which is not a priori located in any of the sets of $\mathcal{A}$, belongs to a particular set $A \in \mathcal{A}$. Suppose it is known that $\omega$ belongs to $B \in \mathcal{A}$ and $P(B) > 0$. The probability measure $P$ must be updated into $P_B$, which quantifies the same events as previously but after taking into due consideration the knowledge that $\omega \in B$. $P_B$ is obtained by Bayes' rule of conditioning:

$$P_B(A) = \frac{P(A \cap B)}{P(B)} \quad \forall A \in \mathcal{A}.$$

This rule can be obtained by requiring that:

B1: $\forall B \in \mathcal{A}$, $P_B(B) = 1$.

B2: $\forall B \in \mathcal{A}$, $\forall X, Y \in \mathcal{A}$ such that $X, Y \subseteq B$: $\dfrac{P_B(X)}{P_B(Y)} = \dfrac{P(X)}{P(Y)}$ if $P(Y) > 0$, and $P_B(Y) = 0$ if $P(Y) = 0$.
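As a concrete illustration of the conditioning rule above, here is a minimal sketch in Python, assuming the finite space $\Omega$ is represented as a set of atoms and $P$ as a dictionary of atomic probabilities; the representation and the function name condition are illustrative choices, not taken from the paper.

```python
# Minimal sketch of Bayes' rule of conditioning on a finite probability space.
# Omega is the set of dictionary keys; P maps each atom to its probability.

def condition(P, B):
    """Return P_B, the measure P conditioned on the event B (Bayes' rule)."""
    P_of_B = sum(p for w, p in P.items() if w in B)
    if P_of_B == 0:
        raise ValueError("P(B) must be strictly positive")
    return {w: (p / P_of_B if w in B else 0.0) for w, p in P.items()}

# Example: a fair six-sided die, conditioned on "the outcome is even".
P = {w: 1 / 6 for w in range(1, 7)}
B = {2, 4, 6}
P_B = condition(P, B)
print(P_B)                       # atoms in B get probability 1/3, the others 0
print(sum(P_B[w] for w in B))    # axiom B1: P_B(B) = 1
```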
Similar articles
Conditioning in Dempster-Shafer Theory: Prediction vs. Revision
We recall the existence of two methods for conditioning belief functions due to Dempster: one, known as Dempster conditioning, that applies Bayesian conditioning to the plausibility function and one that performs a sensitivity analysis on a conditional probability. We recall that while the first one is dedicated to revising a belief function, the other one is tailored to a prediction problem wh...
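To make the first of these two methods concrete, the following is a hedged sketch of Dempster's rule of conditioning on a mass function, assuming masses are stored as a dictionary mapping focal sets (frozensets) to weights; the helper names plausibility and mass_condition_dempster are illustrative, not from the cited paper. The last lines check numerically that Dempster conditioning amounts to Bayesian conditioning of the plausibility function.

```python
# Sketch of Dempster's rule of conditioning on a belief function,
# with the mass function stored as {frozenset: mass}.

def plausibility(m, A):
    """pl(A) = sum of the masses of the focal sets that intersect A."""
    return sum(v for X, v in m.items() if X & A)

def mass_condition_dempster(m, B):
    """Dempster conditioning: move m(X) to X ∩ B and renormalise by pl(B)."""
    k = plausibility(m, B)            # normalisation constant = pl(B)
    if k == 0:
        raise ValueError("pl(B) must be strictly positive")
    mB = {}
    for X, v in m.items():
        inter = X & B
        if inter:                     # masses of sets disjoint from B are discarded
            mB[inter] = mB.get(inter, 0.0) + v / k
    return mB

# Example on Omega = {a, b, c}
m = {frozenset("ab"): 0.5, frozenset("c"): 0.3, frozenset("abc"): 0.2}
B = frozenset("ab")
mB = mass_condition_dempster(m, B)
# Dempster conditioning applies Bayes' rule to pl: pl_B(A) = pl(A ∩ B) / pl(B)
A = frozenset("a")
print(plausibility(mB, A), plausibility(m, A & B) / plausibility(m, B))
```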
A General Framework for Revising Belief Bases Using Qualitative Jeffrey's Rule
Intelligent agents require methods to revise their epistemic state as they acquire new information. Jeffrey’s rule, which extends conditioning to uncertain inputs, is used to revise probabilistic epistemic states when new information is uncertain. This paper analyses the expressive power of two possibilistic counterparts of Jeffrey’s rule for modeling belief revision in intelligent agents. We s...
A Framework for Iterated Belief Revision Using Possibilistic Counterparts to Jeffrey's Rule
Intelligent agents require methods to revise their epistemic state as they acquire new information. Jeffrey’s rule, which extends conditioning to probabilistic inputs, is appropriate for revising probabilistic epistemic states when new information comes in the form of a partition of events with new probabilities and has priority over prior beliefs. This paper analyses the expressive power of tw...
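For reference, here is a minimal sketch of the probabilistic Jeffrey's rule that such counterparts build on: the prior is revised so that the events of a partition receive prescribed new probabilities while conditional probabilities inside each block are preserved. The dictionary representation and the name jeffrey_revise are illustrative, not taken from the cited papers.

```python
# Sketch of Jeffrey's rule for uncertain inputs on a finite space:
# P'(w) = q_i * P(w) / P(B_i) for the partition block B_i containing w.

def jeffrey_revise(P, partition_with_weights):
    """Revise P given [(B_i, q_i), ...], a partition with new probabilities q_i."""
    P_new = {}
    for B_i, q_i in partition_with_weights:
        P_Bi = sum(P[w] for w in B_i)
        if P_Bi == 0:
            raise ValueError("each block must have strictly positive prior probability")
        for w in B_i:
            P_new[w] = q_i * P[w] / P_Bi
    return P_new

# Example: a fair die; new evidence makes "even" twice as likely as "odd".
P = {w: 1 / 6 for w in range(1, 7)}
partition = [({2, 4, 6}, 2 / 3), ({1, 3, 5}, 1 / 3)]
P_prime = jeffrey_revise(P, partition)
print(sum(P_prime[w] for w in {2, 4, 6}))   # 2/3, as prescribed by the input
# When one block receives probability 1, Jeffrey's rule reduces to Bayes' rule.
```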
Belief revision generalized: A joint characterization of Bayes' and Jeffrey's rules
We present a general framework for representing belief-revision rules and use it to characterize Bayes's rule as a classical example and Jeffrey's rule as a non-classical one. In Jeffrey's rule, the input to a belief revision is not simply the information that some event has occurred, as in Bayes's rule, but a new assignment of probabilities to some events. Despite their differences, Bayes's and J...
Classical Belief Conditioning and its Generalization to DSm Theory
Brief introductions to both Dempster-Shafer and DSm theories are presented. Classical belief conditioning is recalled and generalized to DSm hyper-power sets. The relation of the generalized classic conditioning rules to the belief conditioning defined in DSmT is discussed.